45 research outputs found

    Riesz measures and Wishart laws associated to quadratic maps

    We introduce a natural definition of Riesz measures and Wishart laws associated to an Ω-positive (virtual) quadratic map, where Ω ⊂ ℝⁿ is a regular open convex cone. We give a general formula for the moments of the Wishart laws. Moreover, if the quadratic map is equivariant under the action of a linear group acting transitively on the cone Ω, then the associated Riesz measure and Wishart law are described explicitly using the theory of relatively invariant distributions on homogeneous cones.
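As a hedged illustration of the setup in this abstract (not the paper's construction): the quadratic map q(X) = XXᵀ from ℝ^{n×p} to symmetric n×n matrices is positive with respect to the cone of positive semidefinite matrices, and the law it induces from a standard Gaussian is the classical Wishart distribution. The sketch below, with illustrative parameter values, checks the first moment E[q(X)] = p·Iₙ numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, N = 2, 3, 50_000

# The quadratic map q(X) = X X^T sends R^{n x p} into the closed cone
# of positive semidefinite n x n matrices.
def q(X):
    return X @ X.T

# Empirical first moment of the induced law for X with i.i.d. N(0, 1)
# entries; the classical Wishart moment formula gives E[q(X)] = p * I_n.
samples = np.stack([q(rng.standard_normal((n, p))) for _ in range(N)])
mean = samples.mean(axis=0)
assert np.allclose(mean, p * np.eye(n), atol=0.1)
```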

    Wishart laws and variance function on homogeneous cones

    We present a systematic study of Riesz measures and their natural exponential families of Wishart laws on a homogeneous cone. We compute explicitly the inverse of the mean map and the variance function of a Wishart exponential family. Comment: 24 pages, Probab. Math. Statist. (2019)

    Joint Group Invariant Functions on Data-Parameter Domain Induce Universal Neural Networks

    The symmetry and geometry of input data are considered to be encoded in the internal data representation inside the neural network, but the specific encoding rule has been less investigated. In this study, we present a systematic method to induce a generalized neural network and its right inverse operator, called the ridgelet transform, from a joint group invariant function on the data-parameter domain. Since the ridgelet transform is an inverse, (1) it can describe the arrangement of parameters for the network to represent a target function, which is understood as the encoding rule, and (2) it implies the universality of the network. Based on group representation theory, we present a new simple proof of the universality using Schur's lemma in a unified manner covering a wide class of networks, for example, the original ridgelet transform, formal deep networks, and the dual voice transform. Since traditional universality theorems were demonstrated using functional analysis, this study sheds light on the group-theoretic aspect of approximation theory, connecting geometric deep learning to abstract harmonic analysis. Comment: NeurReps 202
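As a loose, hedged illustration of the universality this abstract refers to (not the paper's construction): a shallow network Σₖ cₖ σ(aₖx − bₖ) can approximate a smooth target once enough hidden parameters are available. The ridgelet transform describes exactly which arrangement of (a, b, c) reconstructs the target; the sketch below instead samples (a, b) at random and solves for the output weights c by least squares, with all ranges and sizes chosen purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Target function to approximate on [-1, 1].
f = lambda x: np.sin(3 * x)
x = np.linspace(-1, 1, 200)

# Hidden parameters (a, b) sampled at random; ridgelet theory would
# instead prescribe the arrangement that reconstructs f exactly.
a = rng.uniform(-5, 5, 100)
b = rng.uniform(-5, 5, 100)

# Hidden-layer activations sigma(a x - b) with sigma = tanh.
Phi = np.tanh(np.outer(x, a) - b)

# Output weights c fitted by least squares.
c, *_ = np.linalg.lstsq(Phi, f(x), rcond=None)
err = np.max(np.abs(Phi @ c - f(x)))
```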

    Model selection in the space of Gaussian models invariant by symmetry

    We consider multivariate centred Gaussian models for the random variable Z = (Z_1, …, Z_p), invariant under the action of a subgroup of the group of permutations on {1, …, p}. Using the representation theory of the symmetric group over the field of reals, we derive the distribution of the maximum likelihood estimate of the covariance parameter Σ and also the analytic expression of the normalizing constant of the Diaconis-Ylvisaker conjugate prior for the precision parameter K = Σ⁻¹. We can thus perform Bayesian model selection in the class of complete Gaussian models invariant by the action of a subgroup of the symmetric group, which we could also call complete RCOP models. We illustrate our results with a toy example of dimension 4 and several examples of selection within cyclic groups, including a high-dimensional example with p = 100. Comment: 34 pages, 4 figures, 5 tables
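A hedged sketch of the invariance structure described above, with illustrative values rather than the paper's examples: for a model invariant under the cyclic group acting on {1, …, p}, the covariance is circulant, and the maximum likelihood estimate under the invariant model is the projection of the sample covariance onto the invariant subspace, i.e. its average over the group orbit:

```python
import numpy as np

rng = np.random.default_rng(1)
p, N = 4, 10_000

# A covariance invariant under cyclic shifts is circulant; this one is
# positive definite (its DFT eigenvalues are 3.3, 1.7, 1.3, 1.7).
c = np.array([2.0, 0.5, 0.3, 0.5])
Sigma = np.array([[c[(j - i) % p] for j in range(p)] for i in range(p)])

Z = rng.multivariate_normal(np.zeros(p), Sigma, size=N)
S = Z.T @ Z / N  # unrestricted sample covariance

# MLE under cyclic invariance: average S over the orbit of the cyclic
# shift P, which projects onto the circulant (invariant) subspace.
P = np.roll(np.eye(p), 1, axis=0)
Sigma_hat = sum(
    np.linalg.matrix_power(P, k) @ S @ np.linalg.matrix_power(P, k).T
    for k in range(p)
) / p
```

The orbit average commutes with P by construction, so Sigma_hat is exactly circulant, while the entrywise error against the true Σ shrinks with the sample size.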